2024-08-15 10:40:39 · AIbase
Anthropic Adds 'Prompt Caching' to the Claude API, Letting Developers Cache Frequently Used Context
Anthropic has announced a new 'Prompt Caching' feature for its Claude family of large language models, aimed at significantly cutting AI costs for enterprises while improving performance. The feature lets developers store frequently used context and reuse it across API calls instead of resending and reprocessing it with every request. It is available in public beta for Claude 3.5 Sonnet and Claude 3 Haiku, and Anthropic expects it to reduce costs by up to 90% and double response speed in certain scenarios. The feature is particularly suited to use cases that require...
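As a rough illustration of how this works in practice, the sketch below marks a large system-prompt block for caching using the Anthropic Python SDK. The file name, question, and model string are illustrative; the `cache_control` field and the `anthropic-beta: prompt-caching-2024-07-31` opt-in header follow Anthropic's prompt-caching documentation at the time of the beta launch, but treat the exact parameters as an assumption rather than a definitive reference.

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# A large, reusable context (here, a hypothetical product manual) is marked
# with cache_control so later calls can reuse it instead of reprocessing it.
long_reference_text = open("product_manual.txt").read()  # illustrative file

response = client.messages.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=1024,
    system=[
        {"type": "text", "text": "You answer questions about the product manual."},
        {
            "type": "text",
            "text": long_reference_text,
            "cache_control": {"type": "ephemeral"},  # cache this block
        },
    ],
    messages=[{"role": "user", "content": "How do I reset the device?"}],
    extra_headers={"anthropic-beta": "prompt-caching-2024-07-31"},  # beta opt-in
)

print(response.content[0].text)
```

On subsequent requests that reuse the same cached block, the response's usage metadata reports cached input separately (e.g. `cache_read_input_tokens`), which is where the cost and latency savings show up.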